Mutual information in Gaussian channels

Authors

Abstract


Related articles

Mutual Information and Minimum Mean-Square Error in Multiuser Gaussian Channels

Due to the lack of explicit closed-form expressions for the mutual information of binary inputs, which are available only for BPSK and QPSK in the single-input single-output (SISO) case [1], [2], [3], it is of particular importance to address connections between information theory and estimation theory in the multiuser case. Connections between information theory and esti...
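The connection this excerpt alludes to is the scalar I-MMSE relation of Guo, Shamai, and Verdú: in a Gaussian channel, the derivative of the mutual information with respect to SNR equals half the minimum mean-square error. A minimal numerical check for BPSK (a sketch, not code from the paper; the quadrature setup and function names are ours):

```python
import numpy as np

# Gauss-Hermite quadrature for E[f(N)], N ~ N(0,1):
# E[f(N)] = pi^{-1/2} * sum_i w_i f(sqrt(2) x_i).
nodes, weights = np.polynomial.hermite.hermgauss(100)

def gauss_expect(f):
    return np.dot(weights, f(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)

def bpsk_mi(snr):
    # I(snr) = snr - E[ln cosh(snr + sqrt(snr) N)] in nats, for BPSK input
    return snr - gauss_expect(lambda n: np.log(np.cosh(snr + np.sqrt(snr) * n)))

def bpsk_mmse(snr):
    # mmse(snr) = 1 - E[(E[X|Y])^2], with E[X|Y] = tanh(sqrt(snr) * Y)
    return 1.0 - gauss_expect(lambda n: np.tanh(snr + np.sqrt(snr) * n) ** 2)

# The I-MMSE relation: dI/dsnr = mmse(snr) / 2.
for snr in (0.5, 1.0, 2.0):
    h = 1e-4
    dI = (bpsk_mi(snr + h) - bpsk_mi(snr - h)) / (2 * h)
    print(f"snr={snr}: dI/dsnr={dI:.6f}  mmse/2={bpsk_mmse(snr) / 2:.6f}")
```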


Gaussian Process Optimization with Mutual Information

In this paper, we analyze a generic algorithm scheme for sequential global optimization using Gaussian processes. The upper bounds we derive on the cumulative regret for this generic algorithm improve the previously known bounds for algorithms such as GP-UCB by an exponential factor. We also introduce the novel Gaussian Process Mutual Information algorithm (GP-MI), which significantly improves furt...
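For orientation, a toy sketch of a GP-MI-style loop, following our reading of the rule (the kernel, objective, and constants below are illustrative assumptions, not the paper's experiments): the exploration bonus sqrt(var + gamma) - sqrt(gamma) decays as gamma, the accumulated posterior variance of the queried points, grows.

```python
import numpy as np

def rbf(a, b, ell=0.2):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_posterior(X, y, Xs, noise=1e-3):
    # Standard GP regression posterior mean and variance at test points Xs.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)  # k(x,x) = 1
    return mu, np.maximum(var, 1e-12)

rng = np.random.default_rng(0)
f = lambda x: np.sin(3 * x) + 0.5 * np.cos(7 * x)  # toy objective (assumption)
grid = np.linspace(0.0, 1.0, 200)
X, y = np.array([0.5]), np.array([f(0.5)])
gamma, sqrt_alpha = 0.0, 2.0  # confidence constant: a heuristic choice here
for t in range(15):
    mu, var = gp_posterior(X, y, grid)
    acq = mu + sqrt_alpha * (np.sqrt(var + gamma) - np.sqrt(gamma))
    i = np.argmax(acq)
    gamma += var[i]  # information accumulates; the bonus shrinks over time
    X = np.append(X, grid[i])
    y = np.append(y, f(grid[i]) + 0.01 * rng.standard_normal())
print("suggested maximizer:", grid[np.argmax(mu)])
```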


Estimating Mutual Information by Local Gaussian Approximation

Estimating mutual information (MI) from samples is a fundamental problem in statistics, machine learning, and data analysis. Recently it was shown that a popular class of non-parametric MI estimators performs very poorly for strongly dependent variables and has a sample complexity that scales exponentially with the true MI...
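The Gaussian baseline behind such estimators is easy to state: for jointly Gaussian variables, MI has the closed form -0.5 * ln(1 - rho^2), so a plug-in estimator can fit a covariance matrix and evaluate that formula. The sketch below shows only this global-Gaussian plug-in (our simplification; the paper's estimator fits Gaussians locally):

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.8
xy = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=50_000)

def gaussian_mi_plugin(samples):
    # -0.5 * ln det(correlation matrix): exact for Gaussian data, only an
    # approximation otherwise -- hence the local refinement in the paper.
    c = np.corrcoef(samples.T)
    return -0.5 * np.log(np.linalg.det(c))

print("true MI (nats):   ", -0.5 * np.log(1.0 - rho**2))
print("plug-in estimate: ", gaussian_mi_plugin(xy))
```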


Information Capacity of Gaussian Channels

Information capacity of Gaussian channels is one of the basic problems of information theory. Shannon's results for white Gaussian channels and Fano's "water-filling" analysis of stationary Gaussian channels are two of the best-known works of early information theory. Results are given here which extend to a general framework these results and others due to Gallager and to Kadota, Zakai, and Z...
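The water-filling solution mentioned here is concrete enough to sketch for parallel Gaussian sub-channels: choose a water level nu so that the powers p_i = max(0, nu - N_i) exhaust the power budget; the capacity is then 0.5 * sum_i log(1 + p_i / N_i). A small illustration (the noise levels and budget are made-up inputs):

```python
import numpy as np

def waterfill(noise, budget, iters=60):
    # Water-filling over parallel Gaussian sub-channels: bisect on the
    # water level nu so that p_i = max(0, nu - noise_i) sums to the budget.
    lo, hi = 0.0, noise.max() + budget
    for _ in range(iters):
        nu = 0.5 * (lo + hi)
        if np.maximum(0.0, nu - noise).sum() > budget:
            hi = nu  # level too high: allocated power exceeds the budget
        else:
            lo = nu
    p = np.maximum(0.0, nu - noise)
    capacity = 0.5 * np.log1p(p / noise).sum()  # nats per channel use
    return p, capacity

noise = np.array([0.5, 1.0, 2.0, 4.0])  # illustrative noise levels (assumption)
p, c = waterfill(noise, budget=3.0)
print("powers:", p.round(3), " capacity (nats):", round(c, 3))
```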


Variational Information Maximization in Gaussian Channels

Recently, we introduced a simple variational bound on mutual information that resolves some of the difficulties in applying information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high-dimensional vector x and its low-dimensional represen...
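The PCA fact quoted here can be illustrated directly: for Gaussian x and a linear channel y = Wx + eps with isotropic noise, I(x; y) = 0.5 * log det(I + W C W^T / s2), and among orthonormal projections W this is maximized by the top principal components of the covariance C. A sketch under those assumptions (the covariance and noise level are toy values):

```python
import numpy as np

rng = np.random.default_rng(2)
C = np.diag([5.0, 2.0, 0.5, 0.1])  # toy source covariance (assumption)
s2 = 0.01                          # channel noise variance (assumption)

def proj_info(W, C, s2):
    # I(x; y) for y = W x + eps, eps ~ N(0, s2 * I), x ~ N(0, C), in nats.
    M = W @ C @ W.T / s2
    return 0.5 * np.log(np.linalg.det(np.eye(len(M)) + M))

evals, evecs = np.linalg.eigh(C)
W_pca = evecs[:, np.argsort(evals)[::-1][:2]].T  # top-2 principal directions
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))
W_rand = Q.T                                     # random orthonormal projection

print("PCA projection info (nats):   ", proj_info(W_pca, C, s2))
print("random projection info (nats):", proj_info(W_rand, C, s2))
```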



Journal

Journal title: Journal of Multivariate Analysis

Year: 1974

ISSN: 0047-259X

DOI: 10.1016/0047-259X(74)90006-2